Submodalities in neuro-linguistic programming are distinctions of form or structure (rather than content) within a sensory representational system. For example, regardless of the content, both external and mental images of any kind will be either colored or monochrome, and stationary or moving. These parameters are submodalities within the visual sense. Similarly, both remembered and actual sounds will be mono or stereo when experienced internally, so mono/stereo is a submodality of sound.
NLP asserts that, far from being arbitrary or unimportant, these submodalities often perform a functional role: they are a means by which emotions, related memories, and felt-sense perceptions such as "importance" are presented to consciousness by the unconscious mind, along with thoughts or memories. The metaphor of "distancing oneself" is taken quite literally: the mental representation of something unimportant appears "farther away" than that of something important.
NLP asserts that, amongst the many possible submodalities, there are often a handful of so-called "critical" submodalities which can functionally effect large-scale change, that these differ between people, and that they can be identified by observation and inquiry. NLP states that a change within these critical submodalities will often correlate with a near-immediate subjective change in the emotion or other felt-sense with which a mental impression presents itself.
Submodalities are therefore seen in NLP as offering valuable therapeutic insight (or metaphor), and potential working methods, into how the human mind internally organizes and subjectively 'views' events. Anthony Robbins, a motivational speaker and NLP proponent, states that "our ability to change the way we feel depends upon our ability to change our submodalities."[1]
The concept of submodalities arose in the field of neuro-linguistic programming (NLP) from the claim that human beings 'code' internal experiences using aspects of their different senses.
Specifically, research within NLP states that for most people the brain often uses these structural elements as a way to 'know' how it feels about an experience and what it signifies internally. The link is stated to be bidirectional: emotions attached to a mental experience are affected by certain submodalities with which it is associated, and those submodalities can in turn be affected if the emotional significance changes.
Submodalities are the subjective structural subdivisions within a given representational system. For example, common visual distinctions include brightness, degree of colour (saturation), size, distance, sharpness, focus, and so on; common auditory distinctions include loudness, pitch, tonal range, distance, clarity, timbre, and so on.
Ordinarily, one can establish these by asking simple questions: is the mental image coloured or monochrome, stationary or moving, near or far? Is the sound loud or quiet, mono or stereo?
A more extensive list of common submodalities is given below.
According to core NLP research, each person's brain seems to code emotional significance differently through variations in mental "image" or representation. Examples found include people whose unconscious minds place black borders around bad memories, people for whom visual images seen dimly are less compelling than those seen brightly, people for whom a subjectively "good" memory is accompanied by one kind of sound whilst a "bad" memory is accompanied by another, and so on.
For most people, there will be a handful of such distinctions which are 'critical' to emotional perception, and thus to their mental processing. For example, these might be submodalities that distinguish optimistic thoughts from depressive ones, or which distinguish compelling and important thoughts from less compelling ones. For any given individual, a submodality that turns out to be critical in how a given memory or thought is subjectively experienced is known as a critical submodality.
The discovery that the emotion associated with a thought is often functionally linked to the submodalities with which that thought is presented to consciousness led to a variety of brief-therapy NLP interventions based upon changing these key submodalities. In effect, voluntary change of submodalities on the part of the subject was often found to produce a long-term change in the concomitant 'feeling' response, paving the way for a number of change techniques based on deliberately altering internal representations. NLP co-originator Richard Bandler in particular has made extensive use of submodality manipulations in the evolution of his work.
To match these subjective distinctions, Eric Robbie (an NLP trainer) demonstrated in 1984 that submodalities can be reliably identified from external behaviour: for visual submodalities, subtle changes in the eye and the facial muscles surrounding it are good indicators; for auditory submodalities, subtle changes in the muscles surrounding the ears perform the same function; and for kinesthetic submodalities, subtle changes in the musculature of the body reveal subjective variations in that modality.[2]
Examples of distinctions that are embedded within sensory impressions include:
Representation system | Examples of submodalities
---|---
Visual (sight, images, spatial) | brightness, degree of colour (saturation), size, distance, sharpness, focus, stationary or moving
Auditory (sound, voice) | loudness, pitch, tonal range, distance, clarity, timbre, mono or stereo
Kinesthetic (proprioceptive, somatic) |
Olfactory/Gustatory |